Complexity of Neural Network Approximation with Limited Information: A Worst Case Approach
Authors
Abstract
Similar resources
Complexity of Neural Network Approximation with Limited Information: A Worst Case Approach
In neural network theory the complexity of constructing networks to approximate input-output (i-o) functions is of interest. We study this in the more general context of approximating elements f of a normed space F using partial information about f. We assume information about f and the size of the network are limited, as is typical in radial basis function networks. We show complexity can be ...
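The worst-case, limited-information setting sketched above is the standard one from information-based complexity. As an illustration only (the notation here is my own, not taken from the paper's abstract): an approximation is built from finitely many linear functionals of f, and its error is measured against the hardest f in the unit ball.

```latex
% Illustrative notation: information about f is n functional evaluations
% N(f) = [L_1(f), \dots, L_n(f)], and an algorithm \phi maps N(f) to an
% approximation of f. The worst-case error over the unit ball of F is
e^{\mathrm{wor}}(\phi, N) \;=\; \sup_{\substack{f \in F \\ \|f\|_F \le 1}}
  \bigl\| f - \phi\bigl(N(f)\bigr) \bigr\|_F .
```

The complexity is then the minimal cost (information plus combinatory cost, e.g. network size) needed to guarantee e^wor ≤ ε for a prescribed ε > 0.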
A New Approach for Investigating the Complexity of Short Term EEG Signal Based on Neural Network
Background and purpose: The nonlinear quality of electroencephalography (EEG), like other irregular signals, can be quantified. Some of these values, such as Lyapunov's exponent, study the divergence of the signal path; some quantifiers need to reconstruct the signal path, while others do not. However, all of these quantifiers require a long signal to quantify the signal complexity. Mate...
Worst Case Complexity of Weighted Approximation and Integration over R^d
We study the worst case complexity of weighted approximation and integration for functions defined over R^d. We assume that the functions have all partial derivatives of order up to r uniformly bounded in a weighted L_p-norm for a given weight function. The integration and the error for approximation are defined in a weighted sense for another given weight ρ. We present a necessary and sufficient c...
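The two weights in this abstract play different roles, which a worked formula makes easier to see. As an illustrative sketch (symbols ψ and ρ are my own labels, not taken from the paper): one weight ψ bounds the derivatives of the admissible functions, while the other weight ρ enters the error criterion.

```latex
% Illustrative: function class with derivatives bounded in a weighted L_p-norm
\|D^{\alpha} f\|_{p,\psi}
  \;=\; \Bigl( \int_{\mathbb{R}^d} |D^{\alpha} f(x)|^{p}\, \psi(x)\, dx \Bigr)^{1/p}
  \;\le\; 1, \qquad |\alpha| \le r,
%
% while the approximation error for a candidate \tilde f is weighted by \varrho:
\| f - \tilde f \|_{q,\varrho}
  \;=\; \Bigl( \int_{\mathbb{R}^d} |f(x) - \tilde f(x)|^{q}\, \varrho(x)\, dx \Bigr)^{1/q},
%
% and weighted integration asks for \int_{\mathbb{R}^d} f(x)\, \varrho(x)\, dx.
```

Because the domain R^d is unbounded, conditions relating ψ and ρ (the "necessary and sufficient c..." the truncated abstract refers to) govern whether the problem has finite complexity at all.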
Comparison of worst case errors in linear and neural network approximation
Sets of multivariable functions are described for which worst case errors in linear approximation are larger than those in approximation by neural networks. A theoretical framework for such a description is developed in the context of nonlinear approximation by fixed versus variable basis functions. Comparisons of approximation rates are formulated in terms of certain norms tailored to sets of ...
Some comparisons of the worst-case errors in linear and neural network approximation
Worst-case errors in linear and neural-network approximation are investigated in a more general framework of fixed versus variable-basis approximation. Such errors are compared for balls in certain norms, tailored to sets of variable-basis functions. The tools for estimation of rates of variable-basis approximation are applied to sets of functions either computable by perceptrons with periodic or...
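The fixed-versus-variable-basis distinction underlying the two abstracts above can be stated in one line. As an illustrative sketch (my notation, not the papers'): a linear method fixes the n basis functions in advance, whereas a neural network may pick its n computational units from a parametrized family G (e.g. all perceptrons or radial basis functions).

```latex
% Illustrative: n-term approximation of f in a normed space (X, \|\cdot\|)
% Fixed basis \phi_1, \dots, \phi_n (linear approximation):
d_n^{\mathrm{fix}}(f) \;=\; \inf_{c_1, \dots, c_n \in \mathbb{R}}
  \Bigl\| f - \sum_{i=1}^{n} c_i \phi_i \Bigr\|,
%
% Variable basis drawn from a set G (neural-network approximation):
d_n^{\mathrm{var}}(f, G) \;=\; \inf \Bigl\{ \Bigl\| f - \sum_{i=1}^{n} c_i g_i \Bigr\|
  : c_i \in \mathbb{R},\; g_i \in G \Bigr\}.
```

The worst-case comparison in these papers then takes the supremum of each quantity over a ball in a norm tailored to G, exhibiting function sets on which the variable-basis error is strictly smaller.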
Journal
Journal title: Journal of Complexity
Year: 2001
ISSN: 0885-064X
DOI: 10.1006/jcom.2001.0575